Correction to 'Lower Bounds on the VC-Dimension of Smoothly Parametrized Function Classes'

Authors

  • Wee Sun Lee
  • Peter L. Bartlett
  • Robert C. Williamson
Abstract

The paper [3] gives lower bounds on the VC-dimension of various smoothly parametrized function classes. The results were proven by showing a relationship between the uniqueness of decision boundaries and the VC-dimension of smoothly parametrized function classes. The proof is incorrect; there is no such relationship under the conditions stated in [3]. For the case of neural networks with tanh activation functions, we give an alternative proof of a lower bound for the VC-dimension proportional to the number of parameters, which holds even when the magnitude of the parameters is restricted to be arbitrarily small.

Theorem 10 from [3], which was used to prove lower bounds on various neural network classes, is incorrect.

Theorem 10 ([3]). Let A be an open subset of R^m, let X be an open subset of R^n, and let f : A × X → R be a continuously differentiable function (in all of its arguments). Let F := {f(a, ·) : a ∈ A}. If there exists a k-dimensional manifold M ⊆ A which has unique decision boundaries, then VCdim(F̄) ≥ k, where F̄ is the thresholded function class formed from F.

As counterexamples, observe that y = (a − x)² has unique decision boundaries but VC dimension zero. Similarly, y = (x − a)(x − b)² has unique decision boundaries but VC dimension 1. In [3], Theorem 10 was used to prove a lower bound for the VC-dimension of neural networks with tanh activation functions. We can prove this lower bound more directly using Lemma 9 from [3].
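To see concretely why Theorem 10 fails, consider the first counterexample. For every parameter a, (a − x)² ≥ 0 for all x, so thresholding at zero yields only the constant-1 classifier; no point can ever receive the label 0, and the VC dimension is 0 even though the parameter space is a 1-dimensional manifold with unique decision boundaries. The short Python sketch below brute-forces the achievable labelings to illustrate this. It assumes the thresholding convention 1[f(a, x) ≥ 0] and uses a finite grid of parameter values as a heuristic stand-in for all of R, so it is an illustration, not a proof.

```python
# Brute-force check of the first counterexample to Theorem 10:
# the class {x -> (a - x)^2 : a in R}, thresholded at zero.

def thresholded(a, x):
    """Thresholded classifier 1[(a - x)^2 >= 0] (sign convention assumed)."""
    return 1 if (a - x) ** 2 >= 0 else 0

def achievable_labelings(points, params):
    """All labelings of `points` realized as the parameter ranges over `params`."""
    return {tuple(thresholded(a, x) for x in points) for a in params}

# Finite grid standing in for a in R (heuristic, not exhaustive).
params = [i / 10.0 for i in range(-100, 101)]

for sample in ([0.0], [-1.0, 2.0]):
    labelings = achievable_labelings(sample, params)
    shattered = len(labelings) == 2 ** len(sample)
    print(f"points={sample}: labelings={sorted(labelings)}, shattered={shattered}")

# Since (a - x)^2 >= 0 always, only the all-ones labeling ever appears:
# not even one point is shattered, so the VC dimension is 0, contradicting
# the conclusion VCdim >= 1 that Theorem 10 would give for this class.
```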

Similar Articles

Error Bounds for Real Function Classes Based on Discretized Vapnik-Chervonenkis Dimensions

The Vapnik-Chervonenkis (VC) dimension plays an important role in statistical learning theory. In this paper, we propose the discretized VC dimension obtained by discretizing the range of a real function class. Then, we point out that Sauer’s Lemma is valid for the discretized VC dimension. We group the real function classes having infinite VC dimension into four categories by using the dis...


Radial Basis Function Neural Networks Have Superlinear VC Dimension

We establish superlinear lower bounds on the Vapnik-Chervonenkis (VC) dimension of neural networks with one hidden layer and local receptive field neurons. As the main result we show that every reasonably sized standard network of radial basis function (RBF) neurons has VC dimension Ω(W log k), where W is the number of parameters and k the number of nodes. This significantly improves the previousl...


Nearly-tight VC-dimension bounds for piecewise linear neural networks

We prove new upper and lower bounds on the VC-dimension of deep neural networks with the ReLU activation function. These bounds are tight for almost the entire range of parameters. Letting W be the number of weights and L be the number of layers, we prove that the VC-dimension is O(WL log(W)), and provide examples with VC-dimension Ω(WL log(W/L)). This improves both the previously known upper ...


VC Dimension Bounds for Higher-Order Neurons

We investigate the sample complexity for learning using higher-order neurons. We calculate upper and lower bounds on the Vapnik-Chervonenkis dimension and the pseudo dimension for higher-order neurons that allow unrestricted interactions among the input variables. In particular, we show that the degree of interaction is irrelevant for the VC dimension and that the individual degree of the varia...


Margin Analysis

In the last few lectures we have seen how to obtain high confidence bounds on the generalization error of functions learned from function classes of limited capacity, measured in terms of the growth function and VC-dimension for binary-valued function classes in the case of binary classification, and in terms of the covering numbers, pseudo-dimension, and fat-shattering dimension for real-value...



Journal:
  • Neural Computation

Volume 9, Issue -

Pages -

Publication date: 1997